Dynamical system

A dynamical system consists of a set $\Omega$, whose elements are called states, together with an object, called the dynamical law, that tells us how to go from one state to the next, i.e., the states have a time evolution.

In the case of a discrete dynamical system, the dynamical law is a map

$$ F:\mathbb{N}\times \Omega\to \Omega $$

In this case, $\mathbb{N}$ plays the role of "discrete time". The map $F$ is a semigroup (more precisely, monoid) action of $(\mathbb{N},+)$ on $\Omega$: similar to a group action, but with $\mathbb N$ instead of a group, so that $F(0,x)=x$ and $F(n+m,x)=F(n,F(m,x))$. Equivalently, the whole system is determined by iterating the single map $f:=F(1,-)$, since $F(n,x)=f^{\circ n}(x)$.
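
For instance (a standard toy example, chosen here just for concreteness), the logistic map gives a discrete dynamical system on $\Omega=[0,1]$:

$$ f(x)=rx(1-x),\qquad F(n,x)=f^{\circ n}(x)=\underbrace{f\circ\cdots\circ f}_{n\ \text{times}}(x),\qquad 0\le r\le 4. $$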

Continuous dynamical systems, on the other hand, correspond to systems of first-order ODEs, whose solution is a 1-parameter local group of transformations; roughly speaking,

$$ F:\mathbb{R}\times \Omega\to \Omega $$

They can be identified with vector fields of the form

$$ A=\partial_t+\sum_j \phi_j \,\partial_{x_j} $$
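
A simple illustrative example (mine, for concreteness): the ODE $\dot{x}=x$ on $\Omega=\mathbb{R}$ corresponds to the vector field and flow

$$ A=\partial_t+x\,\partial_x,\qquad F(t,x_0)=x_0e^{t}, $$

and one checks the (local) group property $F(t+s,x_0)=F(t,F(s,x_0))$.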

The most important cases, for me, are classical Hamiltonian systems.

Particularly nice among them are the integrable systems.
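
A basic worked example (standard, included here only as illustration): the harmonic oscillator on $\Omega=\mathbb{R}^2$ with coordinates $(q,p)$,

$$ H(q,p)=\tfrac12(p^2+q^2),\qquad \dot q=\frac{\partial H}{\partial p}=p,\qquad \dot p=-\frac{\partial H}{\partial q}=-q, $$

which is integrable: $H$ itself is a conserved quantity, and the flow is a rotation in the $(q,p)$-plane.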

________________________________________


Author of the notes: Antonio J. Pan-Collantes

antonio.pan@uca.es

